A New Smoothing Nonlinear Conjugate Gradient Method for Nonsmooth Equations with Finitely Many Maximum Functions

Authors

  • Yuan-yuan Chen
  • Yisheng Song
Abstract

By Lemmas 4 and 5, we have Lemma 6. Lemma 6. Suppose that Assumption 3 holds and that the step size α_k is determined by (9); then ...

Related articles

A New Smoothing Conjugate Gradient Method for Solving Nonlinear Nonsmooth Complementarity Problems

In this paper, by using the smoothing Fischer-Burmeister function, we present a new smoothing conjugate gradient method for solving nonlinear nonsmooth complementarity problems. The line search we use guarantees the descent property of the method. Under suitable conditions, the new smoothing conjugate gradient method is proved to be globally convergent. Finally, preliminary numerical experiments sh...
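For orientation only, below is a minimal sketch of one common smoothing of the Fischer-Burmeister function; the exact smoothing used in the paper may differ, and the function name and parameter mu are assumptions made for this illustration:

    import numpy as np

    def smoothed_fischer_burmeister(a, b, mu):
        # One common smoothing: phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2*mu^2).
        # As mu -> 0 it recovers the nonsmooth FB function
        # phi(a, b) = a + b - sqrt(a^2 + b^2), whose zeros encode the
        # complementarity conditions a >= 0, b >= 0, a*b = 0.
        return a + b - np.sqrt(a**2 + b**2 + 2.0 * mu**2)

    # For any fixed mu > 0 the function is smooth in (a, b).
    print(smoothed_fischer_burmeister(1.0, 0.0, 1e-3))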


Smoothing Nonlinear Conjugate Gradient Method for Image Restoration Using Nonsmooth Nonconvex Minimization

Image restoration problems are often converted into large-scale, nonsmooth, and nonconvex optimization problems. Most existing minimization methods are not efficient for solving such problems. It is well known that nonlinear conjugate gradient methods are preferred for solving large-scale smooth optimization problems due to their simplicity, low storage, practical computational efficiency, and nice co...
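As a rough sketch of the kind of iteration such smoothing conjugate gradient methods use (not the authors' exact algorithm: the PRP+ update, the fixed step size, and the smoothing schedule below are assumptions):

    import numpy as np

    def smoothing_prp_cg(grad_mu, x0, mu0=1.0, sigma=0.5, alpha=1e-2, tol=1e-6, max_iter=500):
        # grad_mu(x, mu) should return the gradient of a smooth approximation
        # f_mu of the nonsmooth objective; mu is driven toward zero.
        x, mu = np.asarray(x0, dtype=float), mu0
        d = -grad_mu(x, mu)
        for _ in range(max_iter):
            g = grad_mu(x, mu)
            if np.linalg.norm(g) <= tol:
                if mu <= tol:
                    break            # approximately stationary for the original problem
                mu *= sigma          # tighten the smoothing and restart the direction
                d = -grad_mu(x, mu)
                continue
            x_new = x + alpha * d    # fixed step for illustration; the paper uses a line search
            g_new = grad_mu(x_new, mu)
            beta = max(0.0, g_new.dot(g_new - g) / max(g.dot(g), 1e-16))  # PRP+ parameter
            d = -g_new + beta * d
            x = x_new
        return x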


A Modified Levenberg-Marquardt Method for Nonsmooth Equations with Finitely Many Maximum Functions

For solving nonsmooth systems of equations, the Levenberg-Marquardt method and its variants are of particular importance because of their fast local convergence rates. Systems of finitely many maximum functions are very useful in the study of nonlinear complementarity problems, variational inequality problems, Karush-Kuhn-Tucker systems of nonlinear programming problems, and many problems in mecha...
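For reference, a bare-bones Levenberg-Marquardt step for a system F(x) = 0 looks like the sketch below; the damping update and the treatment of the generalized Jacobian in the nonsmooth case are omitted, and the names F, jac, and lam are placeholders rather than anything from the paper:

    import numpy as np

    def levenberg_marquardt_step(F, jac, x, lam):
        # One LM step: solve (J^T J + lam*I) d = -J^T F(x) and return x + d.
        Fx = F(x)
        J = jac(x)      # for nonsmooth F, an element of a generalized Jacobian
        n = x.size
        d = np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ Fx)
        return x + d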


A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirement and simplicity of implementation. Research activities on its application to higher-dimensional systems of nonlinear equations are just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...
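As a generic illustration (not the coefficient choices from this paper), a three-term conjugate gradient direction takes the form d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k with y_k = g_{k+1} - g_k; the sketch below uses one standard three-term PRP variant, chosen so that d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 holds (sufficient descent):

    import numpy as np

    def three_term_prp_direction(g_new, g_old, d_old):
        # One standard three-term PRP search direction (illustrative coefficients).
        y = g_new - g_old
        gg = max(g_old.dot(g_old), 1e-16)   # safeguard against division by zero
        beta = g_new.dot(y) / gg
        theta = g_new.dot(d_old) / gg
        # By construction, d.dot(g_new) == -g_new.dot(g_new).
        return -g_new + beta * d_old - theta * y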


On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions

Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...
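For orientation, one standard way to state a (complementary) approximate KKT condition for min f(x) subject to g_i(x) <= 0 is sketched below; this is the generic textbook form, not necessarily the exact variant developed in the paper:

    \[
      x^k \to x^*, \quad \lambda^k \ge 0, \qquad
      \nabla f(x^k) + \sum_i \lambda_i^k \nabla g_i(x^k) \to 0, \qquad
      \lambda_i^k\, g_i(x^k) \to 0 \ \text{for each } i,
    \]

with the gradient of f replaced by an element of the subdifferential when the objective is nonsmooth and convex.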



Journal: Abstract and Applied Analysis

Volume:   Issue:

Pages: -

Publication year: 2014